A Sound and Complete Algorithm for Learning Causal Models from Relational Data
Authors
Abstract
The PC algorithm learns maximally oriented causal Bayesian networks. However, there is no equivalent complete algorithm for learning the structure of relational models, a more expressive generalization of Bayesian networks. Recent developments in the theory and representation of relational models support lifted reasoning about conditional independence. This enables a powerful constraint for orienting bivariate dependencies and forms the basis of a new algorithm for learning structure. We present the relational causal discovery (RCD) algorithm that learns causal relational models. We prove that RCD is sound and complete, and we present empirical results that demonstrate effectiveness.
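As a point of reference for the propositional setting that RCD generalizes, the following is a minimal Python sketch of the skeleton phase of a PC-style constraint-based learner. The conditional-independence oracle ci_test is an assumed callable, and this is an illustration only, not the authors' RCD implementation, which operates over relational dependencies and additionally uses the orientation constraint for bivariate dependencies described in the abstract.

from itertools import combinations

def learn_skeleton(variables, ci_test):
    """Return an undirected skeleton and the separating sets found."""
    # Start from the complete undirected graph over the variables.
    adjacent = {v: set(variables) - {v} for v in variables}
    sepset = {}
    depth = 0
    # Grow the conditioning-set size until no node has enough neighbors left.
    while any(len(adjacent[v]) - 1 >= depth for v in variables):
        for x in variables:
            for y in list(adjacent[x]):
                # Try to separate x and y with conditioning sets of size `depth`
                # drawn from the remaining neighbors of x.
                for z in combinations(sorted(adjacent[x] - {y}), depth):
                    if ci_test(x, y, set(z)):
                        adjacent[x].discard(y)
                        adjacent[y].discard(x)
                        sepset[frozenset((x, y))] = set(z)
                        break
        depth += 1
    return adjacent, sepset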
Similar resources
On Learning Causal Models from Relational Data
Many applications call for learning causal models from relational data. We investigate Relational Causal Models (RCM) under relational counterparts of adjacency-faithfulness and orientation-faithfulness, yielding a simple approach to identifying a subset of relational d-separation queries needed for determining the structure of an RCM using d-separation against an unrolled DAG representation of...
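The snippet above refers to answering relational d-separation queries against an unrolled DAG representation. Purely as an illustration, and not the procedure from that paper, here is a minimal d-separation test on an ordinary ground DAG using the standard moralization construction; the graph and variable names are hypothetical, and the networkx library is assumed to be available.

import networkx as nx

def d_separated(dag, xs, ys, zs):
    """Return True if xs and ys are d-separated given zs in the DAG."""
    relevant = set(xs) | set(ys) | set(zs)
    # 1. Keep only the ancestral subgraph of the variables involved.
    ancestral = set(relevant)
    for v in relevant:
        ancestral |= nx.ancestors(dag, v)
    sub = dag.subgraph(ancestral)

    # 2. Moralize: connect parents that share a child, then drop directions.
    moral = nx.Graph(sub.edges())
    moral.add_nodes_from(sub.nodes())
    for child in sub.nodes():
        parents = list(sub.predecessors(child))
        for i in range(len(parents)):
            for j in range(i + 1, len(parents)):
                moral.add_edge(parents[i], parents[j])

    # 3. Remove the conditioning set and test connectivity.
    moral.remove_nodes_from(zs)
    return all(not nx.has_path(moral, x, y)
               for x in xs for y in ys
               if x in moral and y in moral)

# Example on a collider X -> Z <- Y: X and Y are d-separated marginally,
# but conditioning on Z opens the path.
g = nx.DiGraph([("X", "Z"), ("Y", "Z")])
print(d_separated(g, {"X"}, {"Y"}, set()))   # True
print(d_separated(g, {"X"}, {"Y"}, {"Z"}))   # False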
A Characterization of Markov Equivalence Classes of Relational Causal Models under Path Semantics
Relational Causal Models (RCM) generalize Causal Bayesian Networks so as to extend causal discovery to relational domains. We provide a novel and elegant characterization of the Markov equivalence of RCMs under path semantics. We introduce a novel representation of unshielded triples that allows us to efficiently determine whether an RCM is Markov equivalent to another. Under path semantics, we...
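The characterization above concerns Markov equivalence, which in the propositional case is determined by the skeleton together with the colliders at unshielded triples. Continuing the hypothetical skeleton sketch shown earlier (using its adjacent and sepset outputs), the fragment below orients colliders from the recorded separating sets; it illustrates the propositional rule only, not the path-semantics characterization described in that paper.

def orient_colliders(adjacent, sepset):
    """Return directed edges (x, z) and (y, z) for each detected collider."""
    directed = set()
    for z in adjacent:
        neighbors = sorted(adjacent[z])
        for i in range(len(neighbors)):
            for j in range(i + 1, len(neighbors)):
                x, y = neighbors[i], neighbors[j]
                # Unshielded triple x - z - y: x and y are not adjacent.
                # If z is not in the separating set of x and y, it is a collider.
                if y not in adjacent[x] and z not in sepset.get(frozenset((x, y)), set()):
                    directed.add((x, z))
                    directed.add((y, z))
    return directed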
Flattening network data for causal discovery: What could go wrong?
Methods for learning causal dependencies from observational data have been the focus of decades of work in social science, statistics, machine learning, and philosophy [9, 10, 11]. Much of the theoretical and practical work on causal discovery has focused on propositional representations. Propositional models effectively represent individual directed causal dependencies (e.g., path analysis, Ba...
Learning the Structure of Causal Models with Relational and Temporal Dependence
Many real-world domains are inherently relational and temporal—they consist of heterogeneous entities that interact with each other over time. Effective reasoning about causality in such domains requires representations that explicitly model relational and temporal dependence. In this work, we provide a formalization of temporal relational models. We define temporal extensions to abstract groun...
Non-intuitive conditional independence facts hold in models of network data
Many social scientists and researchers across a wide range of fields focus on analyzing a single causal dependency or a conditional model of some outcome variable. However, to reason about interventions or conditional independence, it is useful to construct a joint model of a domain. Researchers in computer science, statistics, and philosophy have developed representations (e.g., Bayesian netwo...
Journal: CoRR
Volume: abs/1309.6843
Issue: -
Pages: -
Publication date: 2013